Data Fusion with Entropic Priors

Authors

  • Francesco Palmieri
  • Domenico Ciuonzo
Abstract

In classification problems, lack of knowledge of the prior distribution may make the application of Bayes’ rule inadequate. Uniform or arbitrary priors often yield classification answers that, even in simple examples, contradict our common sense about the problem. Entropic priors, derived via the maximum entropy principle, seem to provide a much better answer; they can be easily obtained and applied to classification tasks when no more than the likelihood functions are available. In this paper we present an application example in which the use of entropic priors is compared with the results of applying Dempster-Shafer theory.
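The abstract above does not spell out the construction, but in the authors' related work on discrete classification the entropic prior for class c is taken proportional to exp(H_c), where H_c is the Shannon entropy of the likelihood p(x|c). As an illustration only (a minimal sketch under that assumption, not the paper's exact derivation):

```python
import numpy as np

def entropic_priors(likelihoods):
    """Return class priors proportional to exp(H_c), where H_c is the
    Shannon entropy of the discrete likelihood p(x | c) for class c.

    likelihoods: array of shape (n_classes, n_outcomes); each row is a
    likelihood p(x | c) summing to 1.
    """
    L = np.asarray(likelihoods, dtype=float)
    # Shannon entropy of each row, using the convention 0 * log 0 = 0.
    H = -np.sum(np.where(L > 0.0, L * np.log(L), 0.0), axis=1)
    w = np.exp(H)          # entropic weight of each class
    return w / w.sum()     # normalize into a proper prior

# Two classes: a deterministic likelihood (entropy 0) versus a uniform
# likelihood over 4 outcomes (entropy log 4). The maximally uncertain
# class receives the larger prior weight, approximately [0.2, 0.8].
priors = entropic_priors([[1.0, 0.0, 0.0, 0.0],
                          [0.25, 0.25, 0.25, 0.25]])
print(priors)
```

With such a prior in hand, classification proceeds by ordinary Bayes' rule: the posterior for class c given an observation x is proportional to priors[c] * p(x | c).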


Similar Articles

Consistency of Sequence Classification with Entropic Priors

Entropic priors, recently revisited within the context of theoretical physics, were originally introduced for image processing and for general statistical inference. Entropic priors seem to represent a very promising approach to “objective” prior determination when such information is not available. The attention has been mostly limited to continuous parameter spaces and our focus in this work ...


Objective priors from maximum entropy in data classification

Lack of knowledge of the prior distribution in classification problems that operate on small data sets may make the application of Bayes’ rule questionable. Uniform or arbitrary priors may provide classification answers that, even in simple examples, may end up contradicting our common sense about the problem. Entropic priors (EPs), via application of the maximum entropy (ME) principle, seem to...


Entropic Priors for Discrete Probabilistic Networks and for Mixtures of Gaussians Models

The ongoing, unprecedented exponential explosion of available computing power has radically transformed the methods of statistical inference. What used to be a small minority of statisticians advocating the use of priors and strict adherence to Bayes' theorem is now becoming the norm across disciplines. The evolutionary direction is now clear. The trend is towards more realistic, flexi...


Probabilistic Factorization of Non-negative Data with Entropic Co-occurrence Constraints

In this paper we present a probabilistic algorithm which factorizes non-negative data. We employ entropic priors to additionally ensure that user-specified pairs of factors in this model have their cross-entropy maximized or minimized. These priors allow us to construct factorization algorithms that result in maximally statistically different factors, something that generic non-...


Approximate Maximum A Posteriori Inference with Entropic Priors

In certain applications it is useful to fit multinomial distributions to observed data with a penalty term that encourages sparsity. For example, in probabilistic latent audio source decomposition one may wish to encode the assumption that only a few latent sources are active at any given time. The standard heuristic of applying an L1 penalty is not an option when fitting the parameters to a mu...



Publication date: 2010